Linear-Time Algorithms for Geometric Graphs with Sublinearly Many Edge Crossings
We provide linear-time algorithms for geometric graphs with sublinearly many
crossings. That is, we provide algorithms running in O(n) time on connected
geometric graphs having n vertices and k crossings, where k is smaller than n
by an iterated logarithmic factor. Specific problems we study include Voronoi
diagrams and single-source shortest paths. Our algorithms all run in linear
time in the standard comparison-based computational model; hence, we make no
assumptions about the distribution or bit complexities of edge weights, nor do
we utilize unusual bit-level operations on memory words. Instead, our
algorithms are based on a planarization method that "zeroes in" on edge
crossings, together with methods for extending planar separator decompositions
to geometric graphs with sublinearly many crossings. Incidentally, our
planarization algorithm also solves an open computational geometry problem of
Chazelle for triangulating a self-intersecting polygonal chain having n
segments and k crossings in linear time, for the case when k is sublinear in n
by an iterated logarithmic factor.
Comment: Expanded version of a paper appearing at the 20th ACM-SIAM Symposium on Discrete Algorithms (SODA 2009).
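The core planarization idea, replacing each edge crossing with a dummy vertex so that the resulting graph is planar, can be illustrated with a small hypothetical sketch. A brute-force pairwise crossing test stands in for the paper's linear-time "zeroing in" step, and the function names `seg_intersection` and `planarize` are my own, not the paper's:

```python
# Sketch: planarize a geometric graph by splitting every pair of
# crossing edges at a dummy vertex placed at the crossing point.
# Brute-force O(m^2) crossing detection replaces the paper's
# linear-time method; assumes edges in general position.

def seg_intersection(p1, p2, p3, p4):
    """Return the proper intersection point of segments p1p2 and p3p4, or None."""
    (x1, y1), (x2, y2), (x3, y3), (x4, y4) = p1, p2, p3, p4
    d = (x2 - x1) * (y4 - y3) - (y2 - y1) * (x4 - x3)
    if d == 0:
        return None  # parallel or collinear: no proper crossing
    t = ((x3 - x1) * (y4 - y3) - (y3 - y1) * (x4 - x3)) / d
    u = ((x3 - x1) * (y2 - y1) - (y3 - y1) * (x2 - x1)) / d
    if 0 < t < 1 and 0 < u < 1:  # strict: shared endpoints don't count
        return (x1 + t * (x2 - x1), y1 + t * (y2 - y1))
    return None

def planarize(points, edges):
    """Split crossing edges at dummy vertices; return (points, edges)."""
    pts = list(points)
    work = list(edges)   # edges still to be checked
    out = []             # edges known to be crossing-free
    while work:
        a, b = work.pop()
        for i, (c, d) in enumerate(work):
            x = seg_intersection(pts[a], pts[b], pts[c], pts[d])
            if x is not None:
                v = len(pts)
                pts.append(x)        # dummy vertex at the crossing
                work.pop(i)          # both edges are split into halves
                work += [(a, v), (v, b), (c, v), (v, d)]
                break
        else:
            out.append((a, b))
    return pts, out
```

On an X-shaped input of two crossing edges this produces one dummy vertex and four half-edges, which is exactly the transformation the separator-based machinery then operates on.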
On the Stability of Matter
A hypothesis of absolutely stable strange hadronic matter composed of
baryons, here denoted , is tested within many-body
calculations performed using the Relativistic Mean-Field approach. In our
calculations, we employed the interaction compatible with
the binding energy ~MeV given
by the phenomenological energy-independent interaction model by
Yamazaki and Akaishi (YA). We found that the binding energy per , as
well as the central density in many-body systems saturates for mass
number , leaving aggregates highly unstable against
strong interaction decay. Moreover, we confronted the YA interaction model with
kaonic atom data and found that it fails to reproduce the single-nucleon
absorption fractions at rest from bubble chamber experiments.
Comment: Proceedings of the HYP2018 conference, Norfolk/Portsmouth, USA, June 24-29, 2018, submitted to AIP Conference Proceedings.
Thermal error compensation of a 5-axis machine tool using indigenous temperature sensors and CNC integrated Python code validated with a machined test piece
Achieving high workpiece accuracy is the long-term goal of machine tool designers. There are many causes of workpiece inaccuracy, with thermal errors being the most common. Indirect compensation (using prediction models for thermal errors) is a promising strategy for reducing thermal errors without increasing machine tool costs. The modelling approach uses transfer functions: an established dynamic method with a physical basis, whose modelling and calculation speed are suitable for real-time applications. This research presents compensation for the main internal and external heat sources affecting the 5-axis machine tool structure, including spindle rotation, movements of the three linear axes, the rotary C axis, and time-varying environmental temperature, but excluding the cutting process. A mathematical model using transfer functions is implemented in the Python programming language directly in the control system of a milling centre to compensate for thermal errors in real time. The inputs of the compensation algorithm are the machine's indigenous temperature sensors, used primarily for diagnostic purposes, so no additional temperature sensors are necessary. The approach achieved a significant reduction in thermal errors in the three machine directions X, Y and Z during verification tests lasting over 60 hours. Moreover, a thermal test piece was machined to verify the industrial applicability of the introduced approach. The results of the transfer function model, compared with the machine tool's multiple linear regression compensation model, are discussed.
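The transfer-function idea can be sketched minimally: a first-order discrete transfer function maps a measured temperature rise to a predicted thermal displacement, which the controller then subtracts from the commanded position. The gain and time constant below are illustrative placeholders, not the paper's identified model parameters:

```python
import math

# Hypothetical sketch of indirect thermal-error compensation: a
# first-order lag  d[k] = a*d[k-1] + (1-a)*G*dT[k],  a = exp(-dt/tau),
# predicts displacement d (micrometres) from temperature rise dT (K).

def make_compensator(gain_um_per_K, tau_s, dt_s):
    """Return a step function that advances the lag model by one sample."""
    a = math.exp(-dt_s / tau_s)
    state = {"d": 0.0}
    def step(dT):
        state["d"] = a * state["d"] + (1.0 - a) * gain_um_per_K * dT
        return state["d"]  # predicted thermal error, to be subtracted
    return step

# Illustrative spindle warm-up: sensor reads a steady +5 K rise,
# sampled once a minute; prediction settles toward gain * dT = 40 um.
comp = make_compensator(gain_um_per_K=8.0, tau_s=1800.0, dt_s=60.0)
pred = [comp(5.0) for _ in range(200)]
```

In the paper's setup the inputs would be the machine's built-in temperature sensors and the output would feed the CNC's real-time position offset; this sketch only shows the dynamic-lag behaviour that makes the transfer-function approach suitable for time-varying heat sources.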
Role of Hyperon Negative Energy Sea in Nuclear Matter
We have examined the contribution of the filled negative energy sea of
hyperons to the energy/particle in nuclear matter at the one and two loop
levels. While this has the potential to be significant, we find a strong
cancellation between the one and two loop contributions for our chosen
parameters, so that hyperon effects can be justifiably neglected.
Comment: 12 pages, LaTeX, 1 simple figure attached at end (regular PostScript).
Data-Oblivious Graph Algorithms in Outsourced External Memory
Motivated by privacy preservation for outsourced data, data-oblivious
external memory is a computational framework where a client performs
computations on data stored at a semi-trusted server in a way that does not
reveal her data to the server. This approach facilitates collaboration and
reliability over traditional frameworks, and it provides privacy protection,
even though the server has full access to the data and he can monitor how it is
accessed by the client. The challenge is that even if data is encrypted, the
server can learn information based on the client data access pattern; hence,
access patterns must also be obfuscated. We investigate privacy-preserving
algorithms for outsourced external memory that are based on the use of
data-oblivious algorithms, that is, algorithms where each possible sequence of
data accesses is independent of the data values. We give new efficient
data-oblivious algorithms in the outsourced external memory model for a number
of fundamental graph problems. Our results include new data-oblivious
external-memory methods for constructing minimum spanning trees, performing
various traversals on rooted trees, answering least common ancestor queries on
trees, computing biconnected components, and forming open ear decompositions.
None of our algorithms make use of constant-time random oracles.
Comment: 20 pages.
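The defining property, that the sequence of memory accesses depends only on the input size and not on the data values, can be illustrated with a classic small example. Odd-even transposition sort (my choice for illustration, not one of the paper's external-memory algorithms) touches a fixed schedule of index pairs:

```python
# Sketch of data-obliviousness: odd-even transposition sort reads and
# writes a fixed, data-independent sequence of array positions, so a
# server observing only the access pattern learns nothing beyond n.

def oblivious_sort(a):
    a = list(a)
    n = len(a)
    for rnd in range(n):                  # fixed number of rounds
        for i in range(rnd % 2, n - 1, 2):  # indices depend only on n
            lo = min(a[i], a[i + 1])        # compare-exchange: both
            hi = max(a[i], a[i + 1])        # cells are always read
            a[i], a[i + 1] = lo, hi         # and always written back
    return a
```

Contrast this with quicksort, whose accesses trace the pivot comparisons and therefore leak information about the relative order of the (even encrypted) values.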
Automated Classification of Bioprocess Based on Optimum Compromise Whitening and Clustering
The proposed methodology for technological state classification is based on data smoothing, dimensionality reduction, compromise whitening, and optimum clustering. The novelty of our approach lies in the stable-state hypothesis, which improves the initialization of the c-means algorithm and enables an interleaved cross-validation strategy. We also employ the Akaike information criterion to obtain the optimum number of technological
states that minimizes it, while using as many clusters and components as possible. The general approach is applied to state classification of Pseudomonas putida fed-batch cultivation on octanoic acid.
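The whitening and AIC-selection steps can be sketched as follows. Plain eigendecomposition whitening and Lloyd-style k-means with farthest-first initialization stand in for the paper's compromise whitening and c-means; the spherical-Gaussian AIC variant and all function names are my choices, not the paper's:

```python
import numpy as np

# Hypothetical sketch: whiten the data, then score candidate cluster
# counts k with a spherical-Gaussian AIC and keep the minimizer.

def whiten(X):
    """Decorrelate and rescale so the sample covariance is ~identity.
    Assumes the covariance is full rank."""
    Xc = X - X.mean(axis=0)
    vals, vecs = np.linalg.eigh(np.cov(Xc, rowvar=False))
    return Xc @ vecs / np.sqrt(vals)

def farthest_first(X, k):
    """Deterministic farthest-first centroid initialization."""
    idx = [0]
    for _ in range(k - 1):
        dmin = np.min(((X[:, None] - X[idx]) ** 2).sum(-1), axis=1)
        idx.append(int(np.argmax(dmin)))
    return X[idx].astype(float)

def aic_for_k(X, k, iters=50):
    """Fit k centroids with Lloyd's iterations; return a spherical-
    Gaussian AIC (lower is better)."""
    C = farthest_first(X, k)
    for _ in range(iters):
        lab = np.argmin(((X[:, None] - C) ** 2).sum(-1), axis=1)
        C = np.array([X[lab == j].mean(0) if (lab == j).any() else C[j]
                      for j in range(k)])
    lab = np.argmin(((X[:, None] - C) ** 2).sum(-1), axis=1)
    sse = ((X - C[lab]) ** 2).sum()
    n, d = X.shape
    var = sse / (n * d)                      # shared spherical variance
    loglik = -0.5 * n * d * (np.log(2 * np.pi * var) + 1)
    return 2 * (k * d + 1) - 2 * loglik      # AIC = 2p - 2 log L
```

Scanning `aic_for_k` over a range of k and taking the minimizer mirrors the abstract's "optimum number of technological states that minimizes" the criterion.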